Robust Low-Rank Modelling on Matrices and Tensors

Authors

  • Georgios Papamakarios
  • Stefanos Zafeiriou
  • Christos Sagonas
Abstract

Robust low-rank modelling has recently emerged as a family of powerful methods for recovering the low-dimensional structure of grossly corrupted data, and has proven successful in a wide range of applications in signal processing and computer vision. In principle, robust low-rank modelling focuses on decomposing a given data matrix into a low-rank and a sparse component, by minimising the rank of the former and maximising the sparsity of the latter. In several practical cases, however, tensors can be a more suitable representation than matrices for higher-order data. Nevertheless, existing work on robust low-rank modelling has mostly focused on matrices and has largely neglected tensors. In this thesis we aim to bridge the gap between matrix and tensor methods for robust low-rank modelling. We conduct a thorough survey of matrix methods, in which we discuss and analyse the state-of-the-art algorithms. We take one step further and present, for the first time in the literature, a set of novel algorithms for robust low-rank modelling on tensors. We show how both matrix and tensor methods can be extended to deal with missing values, and we discuss non-convex generalisations thereof. Finally, we demonstrate the applicability of all methods, both those already existing and those proposed herein, to a range of computer vision problems of practical interest.
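The decomposition described in the abstract is most commonly posed as principal component pursuit: minimise the nuclear norm of the low-rank component plus a weighted l1 norm of the sparse component, subject to the two summing to the data matrix. As a concrete illustration only (not code from the thesis), the following is a minimal NumPy sketch of the widely used inexact augmented Lagrange multiplier scheme for that convex programme; the weight lam = 1/sqrt(max(m, n)) and the penalty-update rule are standard defaults rather than choices taken from the thesis.

import numpy as np

def soft_threshold(X, tau):
    # Proximal operator of the l1 norm: elementwise shrinkage towards zero.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    # Singular value thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def rpca_ialm(M, lam=None, rho=1.5, tol=1e-7, max_iter=500):
    # Decompose M ≈ L + S with L low-rank and S sparse, by minimising
    # ||L||_* + lam * ||S||_1 subject to L + S = M (principal component pursuit).
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = 1.25 / np.linalg.norm(M, 2)      # initial penalty parameter
    Y = np.zeros_like(M)                  # Lagrange multipliers
    S = np.zeros_like(M)
    for _ in range(max_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)              # low-rank update
        S = soft_threshold(M - L + Y / mu, lam / mu)   # sparse update
        residual = M - L - S
        Y = Y + mu * residual                          # dual ascent step
        mu = min(rho * mu, 1e7)                        # increase penalty, capped
        if np.linalg.norm(residual) <= tol * np.linalg.norm(M):
            break
    return L, S

# Toy usage: a rank-2 matrix corrupted by 5% gross errors is recovered in L.
rng = np.random.default_rng(0)
clean = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 80))
noise = 10.0 * rng.standard_normal((100, 80)) * (rng.random((100, 80)) < 0.05)
L_hat, S_hat = rpca_ialm(clean + noise)

Each iteration alternates a singular value thresholding step for the low-rank part with an elementwise shrinkage step for the sparse part, which is exactly the rank-versus-sparsity trade-off the abstract describes.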

Similar articles

Multilinear Low-Rank Tensors on Graphs & Applications

We propose a new framework for the analysis of low-rank tensors which lies at the intersection of spectral graph theory and signal processing. As a first step, we present a new graph-based low-rank decomposition which approximates the classical low-rank SVD for matrices and the multilinear SVD for tensors. Then, building on this novel decomposition, we construct a general class of convex optimization...

Discretized Dynamical Low-Rank Approximation in the Presence of Small Singular Values

Low-rank approximations to large time-dependent matrices and tensors are the subject of this paper. These matrices and tensors are either given explicitly or are the unknown solutions of matrix and tensor differential equations. Based on splitting the orthogonal projection onto the tangent space of the low-rank manifold, novel time integrators for obtaining approximations by low-rank matrices a...
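For orientation only: a widely used construction of this kind is the first-order projector-splitting (KSL) integrator for dynamical low-rank approximation. The sketch below shows one such step for a given time-dependent matrix; it is a generic illustration, not the discretization analysed in the cited paper, and the toy matrix A(t) is invented for the example.

import numpy as np

def projector_splitting_step(U, S, V, dA):
    # One step of the first-order projector-splitting integrator for the
    # dynamical low-rank approximation Y = U @ S @ V.T of a matrix A(t).
    # dA is the increment A(t + h) - A(t) over the step.
    K = U @ S + dA @ V                 # K-step: update the left factor
    U1, S_hat = np.linalg.qr(K)
    S_tilde = S_hat - U1.T @ dA @ V    # S-step: runs "backwards" in time
    L = V @ S_tilde.T + dA.T @ U1      # L-step: update the right factor
    V1, S1_T = np.linalg.qr(L)
    return U1, S1_T.T, V1

# Toy usage: track a slowly varying rank-2 matrix with a rank-2 approximation.
rng = np.random.default_rng(0)
B, C = rng.standard_normal((40, 2)), rng.standard_normal((2, 30))
A = lambda t: np.cos(t) * B @ C + np.sin(t) * B[:, ::-1] @ C
U0, s0, Vt0 = np.linalg.svd(A(0.0), full_matrices=False)
U, S, V = U0[:, :2], np.diag(s0[:2]), Vt0[:2].T
for k in range(100):
    U, S, V = projector_splitting_step(U, S, V, A(0.01 * (k + 1)) - A(0.01 * k))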

Orthogonal Rank-two Tensor Approximation: a Modified High-order Power Method and Its Convergence Analysis

With the notable exceptions that tensors of order 2, that is, matrices, always have best approximations of arbitrary low rank and that tensors of any order always have a best rank-one approximation, it is known that higher-order tensors can fail to have best low-rank approximations. When the condition of orthogonality is imposed, even in the most general case that only one pair of components in...
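As background for the rank-one case mentioned above, a best rank-one approximation is usually computed with the higher-order power method, an alternating scheme. The NumPy sketch below for a third-order tensor is a generic illustration of that method, not the modified method of the cited paper.

import numpy as np

def rank_one_hopm(T, n_iter=200):
    # Higher-order power method: approximate T by lam * outer(u, v, w)
    # with unit vectors u, v, w, for a third-order tensor T.
    u = np.linalg.svd(T.reshape(T.shape[0], -1))[0][:, 0]                    # mode-1 init
    v = np.linalg.svd(np.moveaxis(T, 1, 0).reshape(T.shape[1], -1))[0][:, 0] # mode-2 init
    w = np.linalg.svd(np.moveaxis(T, 2, 0).reshape(T.shape[2], -1))[0][:, 0] # mode-3 init
    for _ in range(n_iter):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w

# Toy usage: extract the dominant rank-one term of a small random tensor.
T = np.random.default_rng(0).standard_normal((5, 6, 7))
lam, u, v, w = rank_one_hopm(T)
approx = lam * np.einsum('i,j,k->ijk', u, v, w)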

Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem

There has been continued interest in seeking a theorem describing optimal low-rank approximations to tensors of order 3 or higher, that parallels the Eckart–Young theorem for matrices. In this paper, we argue that the naive approach to this problem is doomed to failure because, unlike matrices, tensors of order 3 or higher can fail to have best rank-r approximations. The phenomenon is much more...
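For comparison with the matrix case discussed above, the Eckart–Young theorem states that the best rank-r approximation of a matrix, in both the Frobenius and the spectral norm, is obtained by truncating its SVD. A brief NumPy illustration (not taken from the cited paper) follows.

import numpy as np

def best_rank_r(A, r):
    # Eckart-Young: keeping the r leading singular values of the SVD
    # gives the best rank-r approximation of A.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

# Toy usage: the squared Frobenius error equals the sum of the squared
# discarded singular values.
A = np.random.default_rng(1).standard_normal((50, 30))
s = np.linalg.svd(A, compute_uv=False)
err = np.linalg.norm(A - best_rank_r(A, 5))
assert np.isclose(err, np.sqrt(np.sum(s[5:] ** 2)))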

Publication date: 2014